Autonomic computing generally refers to future information processing and networking technologies that are capable of self-awareness for the purposes of self-optimization, self-healing and self-protection. This paper is an overview of the goals, motivations and current status of this technical area, with specific focus on the technical and deployment challenges. Our conclusion is that, while the imperative...
Software technology has evolved rapidly to satisfy the demands arising from the expanding applications of information technology. From the applications perspective, the demands for producing high-quality software for real-time systems, distributed computing infrastructures, the internet, and mobile and ubiquitous (or pervasive) computing systems are enormous. The quality involved may include not only...
This paper argues that software architecture has always been the enabler for the ever-upward spiral in software engineering of more powerful problem-space languages, and will continue to be so.
This paper envisions a new computing paradigm that promises a revolution in the ubiquity of first-class computing devices. This revolution will lead to a plethora of new applications, opportunities and challenges for software developers.
The productivity and dependability challenges we face in software engineering remain significant. Software has become the pacing item and an unexpectedly large expense in the development of many engineered systems. Dealing with this situation will require a radical change in the way we build software, and one candidate to form the basis of that change is model-based development. Model-based development...
Granular Computing is a new computing paradigm. The panel will focus on its potential for E-security and web intelligence, among other areas. One of the main components of granular computing is "Granulate and Conquer," which lies at the heart of this conference. Fruitful discussions and conclusions are expected.
What is granular computing? As the author has said in (T.Y. Lin, 2006): there are no mathematically valid formal definitions yet. Informally, any computing theory/technology that involves elements and granules (generalized subsets) may be called granular computing (GrC). Intuitively, elements are the data, and granules are the basic knowledge. So granular computing includes data and knowledge computing/engineering,...
As two related emerging fields of research, Web intelligence (WI) and brain informatics (BI) mutually support each other. Their synergy will yield profound advances in the analysis and understanding of data, knowledge, intelligence and wisdom, as well as their relationships, organization and creation process. When WI meets BI, it is possible to have a unified and holistic framework for the study of...
This paper discusses the foundations of the conventional style of rule mining, in which rules are extracted from a data table. Rule mining mainly uses the structure of a table, i.e., data partition, but two different approaches are observed: divide and conquer, and covering. The former focuses on the nature of the data partition, while the latter focuses on the nature of information granules. This paper illustrates that granular...
We, as individuals, as well as governments, corporations, and institutions, form a networked society and we are increasingly dependent on that network. The very fabric of our everyday life and business utilizes this networked connectivity, particularly critical infrastructures such as the electric power grid, oil and gas pipeline and distribution systems, telecommunications, transportation, and water...
We have become increasingly dependent on a national technological fabric that contains software, computers, and communication networks as essential components. Unfortunately, our ability to build affordable software systems for which there exists compelling evidence that the systems deliver their services in a manner that satisfies certain critical properties has not kept pace with the importance...
Summary form only given. The very label "critical infrastructure" implies that such systems are important. They are. Within the US alone, there are approximately 28,600 networked Federal Deposit Insurance Corporation (FDIC) institutions, 2 million miles of oil/gas pipelines, 2,800 power plants with 300,000 production sites, 104 nuclear power plants, 80,000 dams, 60,000 chemical plants, 87,000...